
    Vision Egg: an Open-Source Library for Realtime Visual Stimulus Generation

    Modern computer hardware makes it possible to produce visual stimuli in ways not previously possible. Arbitrary scenes, from traditional sinusoidal gratings to naturalistic 3D scenes, can now be specified on a frame-by-frame basis in realtime. A programming library called the Vision Egg aims to make it easy to take advantage of these innovations. The Vision Egg is a free, open-source library making use of OpenGL and written in the high-level language Python with extensions in C. Careful attention has been paid to the issues of luminance and temporal calibration, and several techniques for interfacing with input devices such as mice, movement tracking systems, and digital triggers are discussed. Together, these features make the Vision Egg suitable for many psychophysical, electrophysiological, and behavioral experiments. The software is available for free download at visionegg.org.
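    As a rough illustration of the kind of frame-by-frame stimulus specification the library supports, the sketch below draws a drifting sinusoidal grating. It follows the pattern of the Vision Egg's published demo scripts, but the class and keyword names are given from memory and should be checked against the installed version.

```python
# Minimal drifting sine grating, modeled on the Vision Egg demo scripts.
# Class and keyword names are assumptions to verify against the installed version.
from VisionEgg.Core import get_default_screen, Viewport
from VisionEgg.FlowControl import Presentation
from VisionEgg.Gratings import SinGrating2D

screen = get_default_screen()                        # open the OpenGL display
grating = SinGrating2D(
    position=(screen.size[0] / 2.0, screen.size[1] / 2.0),
    size=(300.0, 300.0),                             # pixels
    spatial_freq=10.0 / screen.size[0],              # cycles per pixel
    temporal_freq_hz=5.0,                            # drift rate
    orientation=45.0)                                # degrees
viewport = Viewport(screen=screen, stimuli=[grating])
presentation = Presentation(go_duration=(5.0, 'seconds'), viewports=[viewport])
presentation.go()                                    # run the realtime stimulus loop
```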

    Contrast sensitivity of insect motion detectors to natural images

    How do animals regulate self-movement despite large variation in the luminance contrast of the environment? Insects are capable of regulating flight speed based on the velocity of image motion, but the mechanisms for this are unclear. The Hassenstein–Reichardt correlator model and its elaborations can accurately predict responses of motion-detecting neurons under many conditions but fail to explain the apparent lack of spatial pattern and contrast dependence observed in freely flying bees and flies. To investigate this apparent discrepancy, we recorded intracellularly from horizontal-sensitive (HS) motion-detecting neurons in the hoverfly while displaying moving images of natural environments. Contrary to results obtained with grating patterns, we show these neurons encode the velocity of natural images largely independently of the particular image used, despite a threefold range of contrast. This invariance in response to natural images is observed in both strongly and minimally motion-adapted neurons but is sensitive to artificial manipulations of contrast. Current models of these cells account for some, but not all, of the observed insensitivity to image contrast. We conclude that fly visual processing may be matched to commonalities between natural scenes, enabling accurate estimates of velocity largely independent of the particular scene.
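    The Hassenstein–Reichardt correlator referred to above multiplies a delayed signal from one photoreceptor with the undelayed signal from its neighbour and subtracts the mirror-image term. The sketch below is a minimal discrete-time version using a first-order low-pass filter as the delay; the time constant is illustrative, not a value from the paper.

```python
import numpy as np

def hassenstein_reichardt(left, right, dt=1e-3, tau=0.035):
    """Minimal Hassenstein-Reichardt correlator sketch.

    left, right : luminance time series from two neighbouring photoreceptors
    tau         : time constant of the low-pass 'delay' filter (illustrative)
    The sign of the output distinguishes preferred from anti-preferred motion.
    """
    alpha = dt / (tau + dt)                      # low-pass filter coefficient
    def lowpass(x):
        y = np.zeros_like(x, dtype=float)
        for i in range(1, len(x)):
            y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])
        return y
    # correlate the delayed signal of one arm with the undelayed signal of the
    # other, then subtract the mirror-symmetric term (fully opponent detector)
    return lowpass(left) * right - lowpass(right) * left
```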

    A `bright zone' in male hoverfly (Eristalis tenax) eyes and associated faster motion detection and increased contrast sensitivity

    Eyes of the hoverfly Eristalis tenax are sexually dimorphic such that males have a fronto-dorsal region of large facets. In contrast to other large flies, in which large facets are associated with a decreased interommatidial angle to form a dorsal `acute zone' of increased spatial resolution, we show that the dorsal region of large facets in males appears to form a `bright zone' of increased light capture without substantially increased spatial resolution. Theoretically, more light allows for increased performance in tasks such as motion detection. To determine the effect of the bright zone on motion detection, local properties of wide-field motion-detecting neurons were investigated using localized sinusoidal gratings. The pattern of local preferred directions of one class of these cells, the HS cells, in Eristalis is similar to that reported for the blowfly Calliphora. The bright zone seems to contribute to local contrast sensitivity; high contrast sensitivity exists in portions of the receptive field served by the large-diameter facet lenses of males and is not observed in females. Finally, temporal frequency tuning is also significantly faster in this frontal portion of the visual field, particularly in males, where it overcompensates for the higher spatial-frequency tuning and shifts the predicted local velocity optimum to higher speeds. These results indicate that the increased retinal illuminance due to the bright zone of males is used to enhance contrast sensitivity and to speed up motion detector responses. Additionally, local neural properties vary across the visual world in a way not expected if HS cells serve purely as matched filters to measure yaw-induced visual motion.
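    The link between temporal and spatial tuning and the predicted velocity optimum mentioned above follows from the standard relation for correlator-type motion detectors, in which the preferred velocity is the temporal-frequency optimum divided by the preferred spatial frequency. The numbers below are illustrative only, not measurements from the paper.

```python
# Predicted local velocity optimum of a correlator-type motion detector:
#     v_opt (deg/s) = temporal-frequency optimum (Hz) / preferred spatial frequency (cycles/deg)
# Numbers are illustrative, not measurements from the paper.
regions = {
    "lateral":       {"tf_hz": 4.0, "sf_cpd": 0.04},   # baseline tuning
    "frontal, male": {"tf_hz": 8.0, "sf_cpd": 0.05},   # faster TF overcompensates higher SF
}
for name, p in regions.items():
    v_opt = p["tf_hz"] / p["sf_cpd"]
    print(f"{name}: predicted velocity optimum = {v_opt:.0f} deg/s")
# lateral: 100 deg/s, frontal male: 160 deg/s -> optimum shifted to higher speeds
```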

    Active and Passive Antennal Movements during Visually Guided Steering in Flying Drosophila

    Insects use feedback from a variety of sensory modalities, including mechanoreceptors on their antennae, to stabilize the direction and speed of flight. Like all arthropod appendages, antennae not only supply sensory information but may also be actively positioned by control muscles. However, how flying insects move their antennae during active turns and how such movements might influence steering responses are currently unknown. Here we examined the antennal movements of flying Drosophila during visually induced turns in a tethered flight arena. In response to both rotational and translational patterns of visual motion, Drosophila actively moved their antennae in a direction opposite to that of the visual motion. We also observed two types of passive antennal movements: small tonic deflections of the antenna and rapid oscillations at wing beat frequency. These passive movements are likely the result of wing-induced airflow and increased in magnitude when the angular distance between the wing and the antenna decreased. In response to rotational visual motion, increases in passive antennal movements appear to trigger a reflex that reduces the stroke amplitude of the contralateral wing, thereby enhancing the visually induced turn. Although the active antennal movements significantly increased antennal oscillation by bringing the arista closer to the wings, they did not significantly affect the turning response in our head-fixed, tethered flies. These results are consistent with the hypothesis that flying Drosophila use mechanosensory feedback to detect changes in the wing-induced airflow during visually induced turns and that this feedback plays a role in regulating the magnitude of steering responses.

    Multi-camera Realtime 3D Tracking of Multiple Flying Animals

    Automated tracking of animal movement allows analyses that would not otherwise be possible by providing great quantities of data. The additional capability of tracking in realtime - with minimal latency - opens up the experimental possibility of manipulating sensory feedback, thus allowing detailed explorations of the neural basis for control of behavior. Here we describe a new system capable of tracking the position and body orientation of animals such as flies and birds. The system operates with less than 40 msec latency and can track multiple animals simultaneously. To achieve these results, a multi-target tracking algorithm was developed based on the Extended Kalman Filter and the Nearest Neighbor Standard Filter data association algorithm. In one implementation, an eleven-camera system is capable of tracking three flies simultaneously at 60 frames per second using a gigabit network of nine standard Intel Pentium 4 and Core 2 Duo computers. This manuscript presents the rationale and details of the algorithms employed and shows three implementations of the system. An experiment was performed using the tracking system to measure the effect of visual contrast on the flight speed of Drosophila melanogaster. At low contrasts, speed is more variable and faster on average than at high contrasts. Thus, the system is already a useful tool to study the neurobiology and behavior of freely flying animals. If combined with other techniques, such as `virtual reality'-type computer graphics or genetic manipulation, the tracking system would offer a powerful new way to investigate the biology of flying animals.
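    As a rough sketch of the approach named above (not the authors' implementation), the code below runs an independent constant-velocity Kalman filter per target, which is the form the Extended Kalman Filter takes for a linear motion and observation model, and assigns each incoming 3D detection to the nearest predicted position within a gate. All parameter values are placeholders.

```python
import numpy as np

DT = 1.0 / 60.0                                   # frame interval at 60 fps

# Constant-velocity state [x, y, z, vx, vy, vz]; with a linear model the EKF
# prediction/update reduce to the standard Kalman filter equations.
F = np.eye(6)
F[:3, 3:] = DT * np.eye(3)                        # position += velocity * DT
H = np.hstack([np.eye(3), np.zeros((3, 3))])      # cameras yield 3D position only
Q = 1e-3 * np.eye(6)                              # process noise (placeholder)
R = 1e-2 * np.eye(3)                              # measurement noise (placeholder)

class Track:
    def __init__(self, pos):
        self.x = np.hstack([pos, np.zeros(3)])    # state estimate
        self.P = np.eye(6)                        # state covariance

    def predict(self):
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q
        return H @ self.x                         # predicted observation

    def update(self, z):
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)       # Kalman gain
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(6) - K @ H) @ self.P

def associate_and_update(tracks, detections, gate=0.05):
    """Nearest-neighbour data association: each track claims the closest
    unassigned detection within the gating radius (metres, placeholder)."""
    free = list(range(len(detections)))
    for track in tracks:
        predicted = track.predict()
        if not free:
            continue
        dists = [np.linalg.norm(detections[i] - predicted) for i in free]
        j = int(np.argmin(dists))
        if dists[j] < gate:
            track.update(detections[free.pop(j)])
```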

    Visual control of flight speed in Drosophila melanogaster

    Flight control in insects depends on self-induced image motion (optic flow), which the visual system must process to generate appropriate corrective steering maneuvers. Classic experiments in tethered insects applied rigorous system identification techniques for the analysis of turning reactions in the presence of rotating pattern stimuli delivered in open-loop. However, the functional relevance of these measurements for visual free-flight control remains equivocal due to the largely unknown effects of the highly constrained experimental conditions. To perform a systems analysis of the visual flight speed response under free-flight conditions, we implemented a `one-parameter open-loop' paradigm using `TrackFly' in a wind tunnel equipped with real-time tracking and virtual reality display technology. Upwind flying flies were stimulated with sine gratings of varying temporal and spatial frequencies, and the resulting flight speed responses were measured. To control flight speed, the visual system of the fruit fly extracts linear pattern velocity robustly over a broad range of spatio-temporal frequencies. The speed signal is used for a proportional control of flight speed within locomotor limits. The extraction of pattern velocity over a broad spatio-temporal frequency range may require more sophisticated motion processing mechanisms than those identified in flies so far. In Drosophila, the neuromotor pathways underlying flight speed control may be suitably explored by applying advanced genetic techniques, for which our data can serve as a baseline. Finally, the high-level control principles identified in the fly can be meaningfully transferred into a robotic context, such as for the robust and efficient control of autonomous flying micro air vehicles.
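    The "proportional control of flight speed within locomotor limits" described above can be written down in a few lines. The sketch below is an illustrative reading of that control principle; the gain, set point, and speed limits are placeholders rather than fitted parameters.

```python
def flight_speed_command(pattern_velocity, current_speed,
                         set_point=0.0, gain=0.5, v_min=0.0, v_max=1.0):
    """Illustrative proportional controller for visual flight speed control.

    pattern_velocity : perceived translational pattern velocity (m/s),
                       positive = front-to-back optic flow
    The fly is assumed to change speed in proportion to the deviation of the
    perceived pattern velocity from a set point, saturating at locomotor
    limits. All parameter values are placeholders, not measurements.
    """
    error = pattern_velocity - set_point
    new_speed = current_speed - gain * error        # proportional response
    return min(max(new_speed, v_min), v_max)        # clamp to locomotor limits
```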

    Neural responses to moving natural scenes.

    Visual movement is important to most animals that move quickly, and even some that do not. What neural computations do animals use to see visual motion in their natural environment? The visual stimulus used to investigate such questions is critical, and stimulus technology has historically limited the experiments that could be performed on responses to naturalistic moving scenes. The ability to display, at high frame rates, moving natural panoramas and other stimuli distorted to compensate for projection onto a flat screen was important to the experiments described here. I therefore created a software library called the 'Vision Egg' that allows such motion stimuli to be generated with recent, inexpensive computer hardware; it was used for those experiments. Additionally, I developed a mathematical model to determine the quality of motion simulation possible with computer displays. This model was applied to reach an understanding of the 'ghosting' artifact sometimes perceived on such apparent-motion displays. Psychophysical experiments on human observers confirmed model predictions and allowed testing of synthetic motion blur for the simulation of smooth motion and elimination of the ghosting artifact. I show that this synthetic motion blur is optimal in the sense of creating the closest possible perception to that of smooth motion experienced in natural settings. Experiments on humans and flies show that such synthetic 'motion blur' has no effect on motion detection per se. However, ghosting in sampled displays results in information not present in smooth motion at high velocities, permitting inappropriate discrimination of rapidly moving features. I performed experiments measuring the responses of hoverfly wide-field motion-detecting neurons (HS cells), in adapted and unadapted states, to the velocity of natural scenes. Responses to natural images of varied intrinsic contrast depend little on the choice of image. Artificially reducing contrast, however, does reduce response magnitudes. Finally, the greatest component of response variation to natural scenes is directly related to local structure in the scenes, and could thus be called 'pattern noise.' The large receptive field of HS cells arises from a (non-linear) spatial summation of numerous elementary motion detectors. I measured spatial and temporal contrast sensitivity of small patches in the large receptive field. As predicted from the presence of a frontal optical acute zone, spatial tuning is highest frontally. A sexually dimorphic 'bright zone' in the frontodorsal eye is correlated with enhanced contrast sensitivity and faster temporal tuning in HS cells with receptive fields in this region of male flies. Thesis (Ph.D.) -- University of Adelaide, School of Molecular and Biomedical Science, 200
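    One way to picture the synthetic motion blur discussed above is as an average over the sub-frame positions a pattern would sweep through during a single display frame, rather than drawing it at one discrete position per frame. The sketch below is only an illustration of that idea with a 1-D test grating; it is not the thesis's model, and all numbers are placeholders.

```python
import numpy as np

xs = np.arange(640)                                               # pixel coordinates
grating = lambda shift: np.sin(2 * np.pi * 0.02 * (xs - shift))   # 1-D test pattern

def blurred_frame(t, frame_period=1 / 60.0, velocity=300.0, n_sub=8):
    """Average the pattern over the positions it sweeps through during one
    display frame instead of drawing a single discrete step; this suppresses
    the 'ghosting' of sampled apparent-motion displays. (Illustrative sketch;
    parameters are placeholders.)"""
    offsets = velocity * (t + np.linspace(0.0, frame_period, n_sub, endpoint=False))
    return np.mean([grating(o) for o in offsets], axis=0)
```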

    Visual Control of Altitude in Flying Drosophila

    Unlike creatures that walk, flying animals need to control their height above the ground as well as their horizontal motion. Research on insects, the first animals to evolve flight, has revealed several visual reflexes that are used to govern horizontal course. For example, insects orient toward prominent vertical features in their environment [1], [2], [3], [4] and [5] and generate compensatory reactions to both rotations [6] and [7] and translations [1], [8], [9], [10] and [11] of the visual world. Insects also avoid impending collisions by veering away from visual expansion [9], [12], [13] and [14]. In contrast to this extensive understanding of the visual reflexes that regulate horizontal course, the sensory-motor mechanisms that animals use to control altitude are poorly understood. Using a 3D virtual reality environment, we found that Drosophila utilize three reflexes—edge tracking, wide-field stabilization, and expansion avoidance—to control altitude. By implementing a dynamic visual clamp, we found that flies do not regulate altitude by maintaining a fixed value of optic flow beneath them, as suggested by a recent model [15]. The results identify a means by which insects determine their absolute height above the ground and uncover a remarkable correspondence between the sensory-motor algorithms used to regulate motion in the horizontal and vertical domains.
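    The model being tested with the visual clamp rests on a simple geometric relation: in level flight the ventral optic flow is forward speed divided by height above the ground, so holding that ratio constant specifies altitude only relative to the current speed. The numbers below are illustrative, not values from the paper.

```python
# Ventral optic flow generated by level forward flight (standard geometry):
#     omega (rad/s) = forward_speed / height
# Illustrative numbers, not values from the paper.
forward_speed = 0.5     # m/s
height = 0.1            # m above the ground
omega = forward_speed / height
print(f"ventral optic flow: {omega:.1f} rad/s")    # 5.0 rad/s
# A fly holding omega constant would climb when it speeds up and descend when
# it slows down, which is the behaviour the dynamic visual clamp probes.
```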

    Integrative Model of Drosophila Flight

    This paper presents a framework for simulating the flight dynamics and control strategies of the fruit fly Drosophila melanogaster. The framework consists of five main components: an articulated rigid-body simulation, a model of the aerodynamic forces and moments, a sensory systems model, a control model, and an environment model. In the rigid-body simulation the fly is represented by a system of three rigid bodies connected by a pair of actuated ball joints. At each instant of the simulation, the aerodynamic forces and moments acting on the wings and body of the fly are calculated using an empirically derived quasi-steady model. The pattern of wing kinematics is based on data captured from high-speed video sequences. The forces and moments produced by the wings are modulated by deforming the base wing kinematics along certain characteristic actuation modes. Models of the fly’s visual and mechanosensory systems are used to generate inputs to a controller that sets the magnitude of each actuation mode, thus modulating the forces produced by the wings. This simulation framework provides a quantitative test bed for examining the possible control strategies employed by flying insects. Examples demonstrating pitch rate, velocity, altitude, and flight speed control, as well as visually guided centering in a corridor, are presented.
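    The five-component architecture described above maps naturally onto a fixed-timestep simulation loop. The skeleton below shows one plausible arrangement of those components; all class names, methods, and dynamics are hypothetical stand-ins, not the paper's code.

```python
import numpy as np

DT = 1e-4                                     # simulation timestep, seconds (placeholder)

class RigidBodyModel:
    """Stand-in for the articulated rigid-body simulation (three bodies, two ball joints)."""
    def __init__(self):
        self.state = np.zeros(6)              # [x, y, z, vx, vy, vz], crude placeholder
    def wing_kinematics(self, t, modes):
        return {"stroke_amplitude": 1.0 + modes}       # base stroke deformed by actuation modes
    def integrate(self, force, dt):
        self.state[3:] += force * dt                   # forward-Euler placeholder dynamics
        self.state[:3] += self.state[3:] * dt

class QuasiSteadyAero:
    """Stand-in for the empirically derived quasi-steady aerodynamic model."""
    def forces_and_moments(self, body, kin):
        return np.array([0.0, 0.0, kin["stroke_amplitude"] - 1.0])   # net vertical force only

class Sensors:
    """Stand-in for the visual and mechanosensory system models."""
    def read(self, body, environment):
        return environment["target_altitude"] - body.state[2]        # altitude error signal

class Controller:
    """Stand-in controller: maps sensory signals to actuation-mode magnitudes."""
    def command(self, percept):
        return 0.5 * percept                                          # proportional gain

body, aero, sensors, controller = RigidBodyModel(), QuasiSteadyAero(), Sensors(), Controller()
environment = {"target_altitude": 0.05}                               # metres, placeholder

for step in range(10):
    t = step * DT
    percept = sensors.read(body, environment)             # sensory systems model
    modes = controller.command(percept)                   # control model
    kin = body.wing_kinematics(t, modes)                  # deform base wing kinematics
    force = aero.forces_and_moments(body, kin)            # quasi-steady aerodynamics
    body.integrate(force, DT)                             # advance rigid-body state
```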